“Never stop testing, and your advertising will never stop improving.”—David Ogilvy, Ogilvy & Mather
David Ogilvy, founder of Ogilvy, Benson & Mather, was 38 years old before he went into advertising. Yet, he became one of the founding fathers of modern advertising. Today, Ogilvy & Mather is one of the most recognized agencies in the world.
How did he start so late, yet still succeed where so many others failed?
By doing a lot of A/B testing.
The oft-told story is that Ogilvy was, of all things, a farmer before he went off to start his agency in NYC. But what most people don’t know (or forget) is that Ogilvy spent 10 years working directly under George Gallup, the survey titan, at the Audience Research Institute.
During his time there, he came to deeply appreciate the insights offered by both split testing and survey-based marketing. His dedication to making decisions based on data was arguably what set him apart from his peers and allowed his agency to grow explosively. As Ogilvy liked to put it, “Why should a manufacturer bet his money, perhaps the future of his company, on your instinct?”
“We should use the A/B testing methodology a lot more than we do today.”—Bill Gates, Microsoft
Bill Gates is right, you know. We really don’t A/B test today as much as we should. Some marketers do (for example, marketing automation companies love to split test), but let’s be honest, most agencies still don’t. And part of that is because most people find testing, and its resulting data, very intimidating. But it doesn’t have to be this way.
In an attempt to simplify the thousands of tips on A/B testing (how to do it, when to do it, where to do it, etc.), I’ve come up with seven simple tips you can follow—whether you’re a seasoned testing veteran, or a beginner—to perform insightful A/B tests.
Tip #1: Get the data
“The best lesson I learned about boosting online conversions … is that you just can’t create a ton of A/B tests and expect to boost your conversion rate. You need to gather both qualitative and quantitative data, analyze it, figure out what changes to make, and then run A/B tests.” —Neil Patel, Quick Sprout
Aside from being the founder of Crazy Egg, Kissmetrics, and Quick Sprout, Patel also runs one of the most popular marketing blogs. One insight that stood out to me, from a short post he wrote on A/B testing, was his advice that “A/A” testing should always precede A/B testing.
In other words, he pointed out that many split testing campaigns start from a flawed premise: that the testing method itself is accurate. That may not be the case, which is why he stresses that marketers should first validate their testing setup against known data sets and results, to see if their methods are as reliable as they think.
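To make that concrete, here’s a minimal sketch of an A/A check in Python (the conversion rate, traffic volumes, and threshold are hypothetical, not from Patel’s post): split identical traffic into two buckets, run your usual significance test, and count how often it declares a winner that cannot exist.

```python
import numpy as np
from scipy import stats

# Hypothetical A/A check: both buckets get identical treatment, so any
# declared "winner" is a false positive produced by the method itself.
rng = np.random.default_rng(42)
true_rate = 0.03       # same conversion rate for both buckets
visitors = 5_000       # visitors per bucket
n_runs = 2_000         # simulated A/A experiments
alpha = 0.05           # significance threshold (95% confidence)

false_positives = 0
for _ in range(n_runs):
    conv_a = rng.binomial(visitors, true_rate)
    conv_b = rng.binomial(visitors, true_rate)
    # Two-proportion z-test on the two identical buckets
    p_pool = (conv_a + conv_b) / (2 * visitors)
    se = np.sqrt(2 * p_pool * (1 - p_pool) / visitors)
    z = (conv_a - conv_b) / (visitors * se)
    p_value = 2 * (1 - stats.norm.cdf(abs(z)))
    if p_value < alpha:
        false_positives += 1

print(f"Winners declared on identical buckets: {false_positives / n_runs:.1%}")
# A sound setup should land near alpha (~5%). Much more than that means
# your testing method, not your marketing, is generating the "lifts."
```

If your tooling reports “winners” on identical buckets far more often than your chosen confidence level implies, fix the method before trusting any A/B result it produces.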
Tip #2: Keep it simple
“I don’t need 100 words. I need 4 words: One page, one goal.”—Derek Halpern, Social Triggers
We all know what K.I.S.S. stands for, yet we all stubbornly ignore what we know to be true. Usually it’s due to insecurity about our content or product (e.g., thinking that if I just add one more line about why my product is better than the competition’s, then it actually will be better). Sometimes, though, it’s because we think we know better—when we actually don’t.
The greatest enemy of good marketing is ego. Relying on gut instinct is a celebrated character trait in stories, but not in real life. The Scientific and Industrial Revolutions weren’t sparked by brash men and women who followed their gut instincts, but by scientists who followed an unbiased scientific method.
For example, read the takeaways from this Skype page design A/B/C test.
Of the three treatments, the simplest one won by a significant margin.
But the value of simplicity isn’t always so obvious or intuitive. A review of several oft-repeated A/B test results reminds us that call-to-action (CTA) buttons shouldn’t stand out too much, and that too many social features can actually hurt sales.
Tip #3: Never underestimate the importance of words
“The great enemy of clear language is insincerity. When there is a gap between one’s real and one’s declared aims, one turns as it were instinctively to long words and exhausted idioms, like a cuttlefish spurting out ink.”—George Orwell, Politics and the English Language
Although George Orwell wasn’t a marketer, the quote above is applicable to business writing. We can all recognize bad business writing by its insincerity. Big words and jargon are two of the biggest red flags in business writing. “Wordy” is not an adjective that you want anyone using to describe your sales copy.
But, in an effort to avoid superfluousness, the mistake many marketers make is that they become too obscure. “Mysterious” is not an adjective that you want anyone using to describe your sales copy, either.
A great example of this is the A/B test Marketing Experiments did on the subject line for one of their newsletter emails. By changing “5 B2B Social Media Career Killers … And How to Overcome Them” to “How to Overcome the 5 Most Common Social Media Mistakes in B2B Marketing,” they increased the clickthrough rate (CTR) by 45 percent.
In general, a great rule of thumb for business writing is that readers care less about how your product differs from a competitor’s and more about what it can do for them. And the best way to tell them what your product can do is to be clear and concise. In Confessions of an Advertising Man, Ogilvy confided that he would often sell average products simply by listing their benefits (because the competition wasn’t doing the same for theirs).
But conciseness and clarity aren’t everything. Your customer’s psychology is an incredibly important factor that’s often overlooked. Sometimes, even changing a single word can make a huge difference to conversion rates.
Don’t believe me? Check out this VWO case study, in which they changed “Request a quote” to “Request pricing.”
It was the only change they made, but it upped the CTR by 161 percent.
Tip #4: Test colors, shapes, sizes, and placement
“Visual content, like all forms of content, needs to serve a purpose. If you are creating content of any form simply for the sake of making it, you’re missing the point … and ultimately missing the mark.”—Brent Csutoras, Kairay Media
Although marketers have decades of A/B testing copy under their belts, we have historically been terrible at testing the other elements of an ad or web page: the colors of titles and buttons; the size and shape of logos, pictures, and CTAs; even the placement of copy and white space. Too often, the design of a page is a stylistic choice rather than a data-driven one.
Part of the problem is that it’s harder to measure the impact of such changes. Innovative tools like Crazy Egg, which lets you see a “heat map” of your customers’ clicks, have greatly helped with visual testing, but the chances of getting a false positive are still much higher than with copy tests.
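A quick back-of-the-envelope calculation shows why. At the standard 95 percent confidence level, each element you test independently is another roll of the dice, and the odds of at least one spurious “winner” climb fast (a sketch with illustrative numbers):

```python
# Why testing many visual elements inflates false positives: at 95%
# confidence, each independent test has a 5% chance of a fluke "win."
alpha = 0.05

for n_elements in (1, 5, 10, 20):
    # Probability that at least one of n independent tests fires by luck
    p_spurious = 1 - (1 - alpha) ** n_elements
    print(f"{n_elements:>2} elements tested -> "
          f"{p_spurious:.0%} chance of a spurious winner")
# 1 -> 5%, 5 -> 23%, 10 -> 40%, 20 -> 64%
```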
The only real way to methodically test the visual elements of a web page is to A/B test each part of it separately, and then wait for the results. Obviously, no one has time to do that on any launch timeline.
That’s why you have to rely on generally-accepted visual marketing theory (e.g., color psychology or placement psychology) and an understanding of your buyer personas.
A great example of a successful A/B test that measured the overall impact of revised copy and a redesigned web page is this case study from Marketing Experiments.
[Screenshots from the case study: Treatment A and Treatment B]
Treatment A had a conversion rate of 2.4 percent, while Treatment B had a conversion rate of 8.8 percent, an increase of 262.3 percent.
Tip #5: Optimize for revenue, not conversions
“It’s much easier to double your business by doubling your conversion rate than by doubling your traffic.”—Jeff Eisenberg, Buyer Legends
By the same token, it can be much easier to double your business by optimizing for revenue than by chasing conversion rate alone.
It’s not hocus pocus; it’s the reality of A/B testing optimization. If you could choose between optimizing a landing page for more conversions or for more revenue, which would you choose? Obviously the latter, and that’s why it’s so important to recognize that those two optimizations will not always look the same.
A great example of this is a VWO case study on how Server Density changed their pricing model. They originally offered pay-per-server pricing, in which customers paid more for each server they added, and wanted to experiment with a packaged pricing model instead. Here’s what their pricing A/B test looked like:
What they discovered was that, although their free signup conversion rate dropped by 24.96 percent, their revenue increased by 114 percent. Certainly not intuitive, and certainly not something they would have figured out without smart A/B testing.
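The arithmetic behind that kind of result is worth spelling out. Here’s a minimal sketch with made-up numbers (not Server Density’s actual figures) showing how a lower conversion rate and higher revenue can coexist:

```python
# Hypothetical illustration of Tip #5: a variant can convert worse and
# still earn more. These numbers are invented for the example; they are
# not Server Density's actual figures.

def revenue_per_visitor(conversion_rate, avg_revenue_per_sale):
    """Expected revenue each visitor generates for the page."""
    return conversion_rate * avg_revenue_per_sale

# Control: more signups on cheap pay-per-server plans
control = revenue_per_visitor(conversion_rate=0.040, avg_revenue_per_sale=25.0)

# Variant: fewer signups on pricier packaged plans
variant = revenue_per_visitor(conversion_rate=0.030, avg_revenue_per_sale=75.0)

print(f"Control: ${control:.2f} per visitor")  # $1.00
print(f"Variant: ${variant:.2f} per visitor")  # $2.25
# The variant converts 25% fewer visitors yet earns 125% more per
# visitor: optimize for the metric that actually pays the bills.
```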
Tip #6: Don’t be afraid to rely on your customers (i.e., customer survey data)
“The single best lesson we’ve learned with improving conversions is that you must survey your audience. If you survey your audience, you get to understand the minds of your visitors and then can alter your marketing message accordingly.”—AJ Kumar, Single Grain
Surveys, like A/B testing, can be intimidating. So a lot of people just don’t do them. This is ill-advised.
As I already mentioned, David Ogilvy built the fastest-growing, most impressive rags-to-riches advertising agency on Madison Avenue because he firmly believed in the power of customer surveys. Ten years of working under Dr. Gallup taught him that what customers want is all that really matters.
Case in point: Tommi Wolfe, president of The Startup Expert, realized her email marketing just wasn’t making the cut. She wanted to find out how she could make her message resonate more with her readers. Instead of A/B testing her messaging, she just asked her readers what she should do.
After identifying a receptive audience, thinking through her survey questions, and compiling and leveraging the results of her survey, she was able to increase her revenue by 600 percent!
Tip #7: Be patient
I could keep rattling off these eye-popping increases in CTR, conversions, and revenue—but I think I’ve already made my point about the value of smart A/B testing.
I’ll leave you with one final tip: Be patient.
A/B testing is not for the trigger-happy. The worst thing you can do when you A/B test is act on results prematurely. Let your tests mature, ruminate on the data you collect, and act only when there is enough supporting data to validate the decision with confidence.
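One practical way to enforce that patience is to calculate, before the test launches, how many visitors you actually need, and then commit to not calling a winner early. Here’s a minimal sketch using a standard two-proportion power calculation (the baseline rate and target lift are hypothetical):

```python
from scipy import stats

def sample_size_per_variant(baseline, lift, alpha=0.05, power=0.80):
    """Visitors needed per variant to detect a relative lift in
    conversion rate (standard two-proportion power calculation)."""
    p1 = baseline
    p2 = baseline * (1 + lift)
    z_alpha = stats.norm.ppf(1 - alpha / 2)  # two-sided significance
    z_power = stats.norm.ppf(power)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = (z_alpha + z_power) ** 2 * variance / (p1 - p2) ** 2
    return int(n) + 1

# Hypothetical: 3% baseline conversion, hoping to detect a 20% lift
print(sample_size_per_variant(baseline=0.03, lift=0.20))
# ~14,000 visitors per variant -- more than a few days of traffic on
# most sites, which is exactly why premature calls go wrong.
```

If the number this spits out is bigger than your traffic can deliver in a reasonable window, that’s your cue to test bigger, bolder changes rather than to peek at the data sooner.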